- A Facebook bug promoted misinformation on the News Feed in recent months instead of suppressing it.
- Engineers first noticed the issue in October and resolved the bug on March 11, The Verge reported.
- Facebook has long grappled with critics who say the platform is a major driver of false information that stokes divisiveness.
A software bug inadvertently amplified misinformation on Facebook instead of combating it, according to a Thursday report from The Verge.
According to an internal report viewed by the outlet, Facebook engineers identified a "massive ranking failure" in the News Feed that, over the past six months, gave wide distribution to posts containing nudity, violence, and misinformation that independent third-party fact-checkers had flagged.
It also boosted posts from Russian state-owned media outlets. Russia has since blocked Facebook after the company restricted those outlets' accounts in response to the country's invasion of Ukraine in February.
The underlying software flaw was introduced in 2019 but caused no noticeable problems until October 2021, when engineers first spotted it. The bug was fixed on March 11, The Verge reported.
Facebook, now Meta, did not immediately respond to Insider's request for comment. Company spokesperson Joe Osborne confirmed the issue to The Verge and said Meta "detected inconsistencies in downranking on five separate occasions, which correlated with small, temporary increases to internal metrics."
How Facebook ranks content and pushes posts to users has become a contentious topic for the company in recent years.
For example, in 2018 the platform rolled out an algorithm tweak that promoted the posts it predicted users would engage with most, such as content from friends and family. The change was designed to keep people scrolling longer, but it also increased the circulation of violent, false, and politically divisive content.
The change also reshaped news production, forcing publishers to reorient their business models around the algorithm, since readers had proven more likely to click on sensationalized, so-called clickbait stories.
The "Facebook Papers" from 2021 in part revealed that Facebook employees were concerned that the 2018 algorithm change would indeed elevate political divisiveness and outrage.
The platform took action in August 2021, announcing it would begin downranking political content in people's News Feeds, or reducing the number of political posts that users see. The move marked a departure for Facebook, which has traditionally leaned heavily on a ranking algorithm that predicts how likely a user is to share or comment on a given post based on past engagement.
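To make the mechanics concrete, here is a minimal Python sketch of how engagement-based ranking and downranking might interact. Every name and number here (Post, feed_score, DEMOTION_MULTIPLIER) is hypothetical; Meta's actual ranking systems are not public. The sketch only illustrates the general idea behind the reported failure: flagged posts are supposed to be demoted before the feed is sorted, so skipping that step lets high-engagement misinformation rise instead of sink.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # model's estimate of like/share/comment likelihood
    flagged_by_fact_checkers: bool = False

# Hypothetical demotion factor; real values are not public.
DEMOTION_MULTIPLIER = 0.1

def feed_score(post: Post, apply_demotion: bool = True) -> float:
    """Score a post by predicted engagement, demoting fact-checked misinformation."""
    score = post.predicted_engagement
    if apply_demotion and post.flagged_by_fact_checkers:
        score *= DEMOTION_MULTIPLIER  # downranking: flagged posts sink in the feed
    return score

posts = [
    Post("friend_photo", predicted_engagement=0.40),
    Post("debunked_claim", predicted_engagement=0.90, flagged_by_fact_checkers=True),
]

# Intended behavior: the flagged post is demoted below ordinary content.
ranked = sorted(posts, key=feed_score, reverse=True)
print([p.post_id for p in ranked])  # ['friend_photo', 'debunked_claim']

# A failure to apply the demotion step, loosely analogous to the bug
# The Verge describes, lets high-engagement misinformation float to the top.
buggy = sorted(posts, key=lambda p: feed_score(p, apply_demotion=False), reverse=True)
print([p.post_id for p in buggy])  # ['debunked_claim', 'friend_photo']
```

The second sort shows why the report calls this a "ranking failure" rather than a content-moderation lapse: the flagged post was correctly identified, but the demotion simply never took effect in the final ordering.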